
    Effect of Winter Canola Cultivar on Seed Yield, Oil, and Protein Content

    Canola (Brassica napus L.) is an oilseed crop that can produce healthy cooking oil and animal feed byproducts. Although it is a relatively new crop, approved for human consumption less than 40 yr ago, advances in breeding have allowed for its production as a winter crop in the southeastern United States. There is little published research, however, related to its performance and quality in this region. Therefore, a study was conducted during the 2014–2015 (Year 1) and 2015–2016 (Year 2) seasons in Tennessee. Twenty-three cultivars were planted in a randomized complete block design with four replications across both years to determine seed yield, seed oil, and seed protein content. Differences in fertilizer application rates, planting, and harvest management, and differences in weather conditions, probably led to significant interactions between years. Cultivar yields ranged from 1269 to 2647 and from 1494 to 4199 kg ha−1, seed oil content ranged from 44 to 48% and from 43 to 46%, and seed protein content ranged from 20 to 24% and from 19 to 23% for Years 1 and 2, respectively. In each year, open-pollinated cultivars had significantly lower yield and oil content but significantly greater protein content than hybrid cultivars. There was also a strong negative correlation between seed oil and seed protein, and the linear models were significant (r = 0.88, p < 0.0001 for Year 1; r = 0.85, p < 0.0001 for Year 2). Recommended winter canola cultivars include Exp1302 and Hekip.
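    The oil–protein trade-off reported above can be illustrated with a Pearson correlation on per-cultivar means. This is a minimal sketch; the six oil/protein pairs below are synthetic placeholders within the ranges quoted in the abstract, not the study's data.

```python
from statistics import mean, stdev

# Hypothetical per-cultivar means (%, within the abstract's reported ranges)
oil     = [48.0, 47.2, 46.5, 45.8, 45.0, 44.3]  # seed oil content
protein = [20.1, 20.8, 21.6, 22.2, 23.0, 23.8]  # seed protein content

def pearson_r(x, y):
    """Sample Pearson correlation coefficient."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

r = pearson_r(oil, protein)
print(f"r = {r:.2f}")  # strongly negative for these illustrative values
```

    For these synthetic values the correlation is close to −1, mirroring the strong inverse oil–protein relationship the study reports.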

    Simulated Performance of 3-DTI Gamma-Ray Telescope Concepts

    We present Monte Carlo simulations of two astronomical gamma-ray telescope concepts based on the Three-Dimensional Track Imager (3-DTI) detector. The 3-DTI consists of a time projection chamber with two-dimensional, crossed-strip micro-well detector readout. The full three-dimensional reconstruction of charged-particle tracks in the gas volume is obtained from transient digitizers, which record the time signature of the charge collected in the wells of each strip. Such detectors hold great promise for advanced Compton telescope (ACT) and advanced pair telescope (APT) concepts due to the very precise measurement of charged-particle momenta that is possible (Compton recoil electrons and electron-positron pairs, respectively). We have investigated the performance of baseline ACT and APT designs based on the 3-DTI detector using simulation tools based on GEANT3 and GEANT4, respectively. We present the expected imaging, spectroscopy, polarimetry, and background performance of each design.

    Medium-Energy Gamma-Ray Astrophysics with the 3-DTI Gamma-Ray Telescope

    Gamma-ray observations in the medium energy range (0.50–50.0 MeV) are central to unfolding many outstanding questions in astrophysics. The challenges of medium-energy gamma-ray observations, however, are the low photon statistics and large backgrounds. We review these questions, address the telescope technology requirements, and describe our development of the 3-Dimensional Track Imaging (3-DTI) Compton telescope and its performance for a new medium-energy gamma-ray mission. The 3-DTI is a large-volume time projection chamber (TPC) with a 2-dimensional gas micro-well detector (MWD) readout.

    Comparing complex impedance and bias step measurements of Simons Observatory transition edge sensors

    The Simons Observatory (SO) will perform ground-based observations of the cosmic microwave background (CMB) with several small and large aperture telescopes, each outfitted with thousands to tens of thousands of superconducting aluminum manganese (AlMn) transition-edge sensor bolometers (TESs). In-situ characterization of TES responsivities and effective time constants will be required multiple times each observing day for calibrating time-streams during CMB map-making. Effective time constants are typically estimated in the field by briefly applying small amplitude square-waves on top of the TES DC biases, and fitting exponential decays in the bolometer response. These so-called "bias step" measurements can be rapidly implemented across entire arrays and therefore are attractive because they take up little observing time. However, individual detector complex impedance measurements, while too slow to implement during observations, can provide a fuller picture of the TES model and a better understanding of its temporal response. Here, we present the results of dark TES characterization of many prototype SO bolometers and compare the effective thermal time constants measured via bias steps to those derived from complex impedance data.
    Comment: 10 pages, 6 figures; SPIE Astronomical Telescopes + Instrumentation 2020, paper 11453-18.
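    The core of the bias-step analysis described above is fitting an exponential decay to the bolometer response after a small step. The sketch below recovers a time constant from a synthetic decay by a log-linear fit; it assumes the response has settled to zero offset, which is a simplification of the real analysis, and all values are illustrative.

```python
import numpy as np

tau_true = 1.5e-3                          # assumed 1.5 ms effective time constant
t = np.linspace(0.0, 6e-3, 300)            # 6 ms of synthetic samples
rng = np.random.default_rng(0)
# Decaying response with 1% multiplicative noise (keeps the signal positive)
signal = 0.8 * np.exp(-t / tau_true) * (1 + 0.01 * rng.normal(size=t.size))

# Fit log(signal) = log(amp) - t/tau with a straight line over the
# well-measured part of the decay
mask = signal > 0.05 * signal[0]
slope, log_amp = np.polyfit(t[mask], np.log(signal[mask]), 1)
tau_fit = -1.0 / slope
print(f"fitted tau_eff = {tau_fit * 1e3:.2f} ms")
```

    A log-linear fit is only valid with a zero baseline; with a nonzero offset one would instead fit amp·exp(−t/τ)+offset with a nonlinear least-squares routine.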

    Insights into the Galactic Cosmic-ray Source from the TIGER Experiment

    We report results from 50 days of data accumulated in two Antarctic flights of the Trans-Iron Galactic Element Recorder (TIGER). With a detector system composed of scintillators, Cherenkov detectors, and scintillating optical fibers, TIGER has a geometrical acceptance of 1.7 sq m sr and a charge resolution of 0.23 charge units at iron. TIGER has obtained abundance measurements of some of the rare galactic cosmic rays heavier than iron, including Zn, Ga, Ge, Se, and Sr, as well as the more abundant lighter elements (down to Si). The heavy elements have long been recognized as important probes of the nature of the galactic cosmic-ray source and accelerator. After accounting for fragmentation of cosmic-ray nuclei as they propagate through the Galaxy and the atmosphere above the detector system, the TIGER source abundances are consistent with a source that is a mixture of about 20% ejecta from massive stars and 80% interstellar medium with solar system composition. This result supports a model of cosmic-ray origin in OB associations previously inferred from ACE-CRIS data of more abundant lighter elements. These TIGER data also support a cosmic-ray acceleration model in which elements present in interstellar grains are accelerated preferentially compared with those found in interstellar gas.
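    The two-component source model stated above is a linear mixture: each source abundance is ~20% massive-star ejecta plus ~80% solar-composition interstellar medium. A minimal sketch of that arithmetic, with made-up, Fe-normalized abundances purely to show the computation (not measured values):

```python
# Mixing fractions from the abstract's stated result
F_EJECTA = 0.20   # massive-star ejecta
F_ISM = 0.80      # solar-composition interstellar medium

# Hypothetical Fe-normalized abundances for each component
ejecta = {"Fe": 1.00, "Zn": 0.012, "Ga": 0.004}
ism    = {"Fe": 1.00, "Zn": 0.009, "Ga": 0.002}

# Mixed source abundance for each element
source = {el: F_EJECTA * ejecta[el] + F_ISM * ism[el] for el in ejecta}
for el, ab in source.items():
    print(f"{el}: {ab:.4f}")
```

    In the actual analysis the mixing fraction is the fitted parameter, chosen so the mixed composition best matches the propagation-corrected TIGER abundances.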

    Development of the Low Frequency Telescope Focal Plane Detector Modules for LiteBIRD

    LiteBIRD is a JAXA-led strategic large-class satellite mission designed to measure the polarization of the cosmic microwave background and Galactic foregrounds from 34 to 448 GHz across the entire sky from L2 in the late 2020s. The scientific payload includes three telescopes, called the low-, mid-, and high-frequency telescopes, each with its own receiver that covers a portion of the mission's frequency range. The low-frequency telescope will map synchrotron radiation from the Galactic foreground and the cosmic microwave background. We discuss the design, fabrication, and characterization of the low-frequency focal plane modules for the low-frequency telescope, which has a total bandwidth ranging from 34 to 161 GHz. There will be a total of four different pixel types with eight overlapping bands to cover the full frequency range. These modules are housed in a single low-frequency focal plane unit which provides thermal isolation, mechanical support, and radiative baffling for the detectors. The module design implements multi-chroic lenslet-coupled sinuous antenna arrays coupled to transition edge sensor bolometers read out with frequency-domain multiplexing. While this technology has strong heritage in ground-based cosmic microwave background experiments, the broad frequency coverage, low optical loading conditions, and the high cosmic ray background of the space environment require further development of this technology to be suitable for LiteBIRD. In these proceedings, we discuss the optical and bolometric characterization of a triplexing prototype pixel with bands centered on 78, 100, and 140 GHz.
    Comment: SPIE Astronomical Telescopes + Instrumentation (AS22).

    Pathways between Primary Production and Fisheries Yields of Large Marine Ecosystems

    The shift in marine resource management from a compartmentalized approach of dealing with resources on a species basis to an approach based on management of spatially defined ecosystems requires an accurate accounting of energy flow. The flow of energy from primary production through the food web will ultimately limit upper trophic-level fishery yields. In this work, we examine the relationship between yield and several metrics including net primary production, chlorophyll concentration, particle-export ratio, and the ratio of secondary to primary production. We also evaluate the relationship between yield and two additional rate measures that describe the export of energy from the pelagic food web, particle export flux and mesozooplankton productivity. We found that primary production is a poor predictor of global fishery yields for a sample of 52 large marine ecosystems. However, chlorophyll concentration, particle-export ratio, and the ratio of secondary to primary production were positively associated with yields. The latter two measures provide greater mechanistic insight into factors controlling fishery production than chlorophyll concentration alone. Particle export flux and mesozooplankton productivity were also significantly related to yield on a global basis. Collectively, our analyses suggest that factors related to the export of energy from pelagic food webs are critical to defining patterns of fishery yields. Such trophic patterns are associated with temperature and latitude, and hence greater yields are associated with colder, high-latitude ecosystems.
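    The cross-ecosystem comparison described above amounts to regressing fishery yield on candidate predictors across the 52 ecosystems. A minimal sketch of that kind of fit, using entirely synthetic placeholder data (not the study's dataset) and a simple least-squares line:

```python
import numpy as np

rng = np.random.default_rng(1)
n_ecosystems = 52
# Synthetic predictor (particle-export ratio) and synthetic log-yield response
export_ratio = rng.uniform(0.05, 0.35, size=n_ecosystems)
log_yield = 2.0 + 6.0 * export_ratio + rng.normal(0.0, 0.3, n_ecosystems)

# Ordinary least-squares line and its coefficient of determination
slope, intercept = np.polyfit(export_ratio, log_yield, 1)
pred = intercept + slope * export_ratio
ss_res = np.sum((log_yield - pred) ** 2)
ss_tot = np.sum((log_yield - np.mean(log_yield)) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"slope = {slope:.2f}, R^2 = {r2:.2f}")
```

    A positive, significant slope is the pattern the study reports for export-related metrics; for primary production the analogous fit was a poor predictor.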

    The Long-Baseline Neutrino Experiment: Exploring Fundamental Symmetries of the Universe

    The preponderance of matter over antimatter in the early Universe, the dynamics of the supernova bursts that produced the heavy elements necessary for life, and whether protons eventually decay --- these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our Universe, its current state and its eventual fate. The Long-Baseline Neutrino Experiment (LBNE) represents an extensively developed plan for a world-class experiment dedicated to addressing these questions. LBNE is conceived around three central components: (1) a new, high-intensity neutrino source generated from a megawatt-class proton accelerator at Fermi National Accelerator Laboratory, (2) a near neutrino detector just downstream of the source, and (3) a massive liquid argon time-projection chamber deployed as a far detector deep underground at the Sanford Underground Research Facility. This facility, located at the site of the former Homestake Mine in Lead, South Dakota, is approximately 1,300 km from the neutrino source at Fermilab -- a distance (baseline) that delivers optimal sensitivity to neutrino charge-parity symmetry violation and mass ordering effects. This ambitious yet cost-effective design incorporates scalability and flexibility and can accommodate a variety of upgrades and contributions. With its exceptional combination of experimental configuration, technical capabilities, and potential for transformative discoveries, LBNE promises to be a vital facility for the field of particle physics worldwide, providing physicists from around the globe with opportunities to collaborate in a twenty- to thirty-year program of exciting science. In this document we provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess.
    Comment: Major update of previous version. This is the reference document for the LBNE science program and current status. Chapters 1, 3, and 9 provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess. 288 pages, 116 figures.
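    The role of the ~1,300 km baseline can be seen in a back-of-envelope two-flavor estimate, P ≈ sin²(2θ) · sin²(1.27 Δm² L / E) with Δm² in eV², L in km, and E in GeV. This is only a sketch, not LBNE's full three-flavor, matter-effect analysis, and the parameter values below are illustrative.

```python
import math

def osc_prob(L_km, E_GeV, dm2_eV2=2.5e-3, sin2_2theta=0.085):
    """Two-flavor vacuum oscillation probability (illustrative parameters)."""
    return sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# At L = 1300 km the first oscillation maximum falls near E ~ 2-3 GeV,
# within the energy range of a conventional accelerator neutrino beam.
for E in (1.0, 2.5, 5.0):
    print(f"E = {E} GeV: P = {osc_prob(1300.0, E):.4f}")
```

    The probability peaks where 1.27 Δm² L / E = π/2, so fixing L fixes the beam energy at which the experiment is most sensitive.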

    Management Strategy Evaluation: Allowing the Light on the Hill to Illuminate More Than One Species

    Management strategy evaluation (MSE) is a simulation approach that serves as a “light on the hill” (Smith, 1994) to test options for marine management, monitoring, and assessment against simulated ecosystem and fishery dynamics, including uncertainty in ecological and fishery processes and observations. MSE has become a key method to evaluate trade-offs between management objectives and to communicate with decision makers. Here we describe how and why MSE is continuing to grow from a single-species approach to one relevant to multi-species and ecosystem-based management. In particular, different ecosystem modeling approaches can fit within the MSE process to meet particular natural resource management needs. We present four case studies that illustrate how MSE is expanding to include ecosystem considerations and ecosystem models as ‘operating models’ (i.e., virtual test worlds), to simulate monitoring, assessment, and harvest control rules, and to evaluate trade-offs via performance metrics. We highlight United States case studies related to fisheries regulations and climate, which support NOAA’s policy goals related to the Ecosystem Based Fishery Roadmap and Climate Science Strategy but vary in the complexity of population, ecosystem, and assessment representation. We emphasize methods, tool development, and lessons learned that are relevant beyond the United States, and the additional benefits relative to single-species MSE approaches.
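    The closed MSE loop described above (operating model → noisy observation → harvest control rule → performance metrics) can be sketched minimally. Everything here is an illustrative assumption: a single-species logistic operating model, lognormal survey error, a fixed-exploitation-rate control rule, and two toy performance metrics.

```python
import random

def run_mse(years=50, r=0.4, K=1000.0, target_u=0.15, obs_cv=0.2, seed=7):
    """One closed-loop MSE run; returns (average catch, P(biomass < 0.2K))."""
    rng = random.Random(seed)
    B = K / 2.0                    # operating model starts at half carrying capacity
    catches, biomasses = [], []
    for _ in range(years):
        obs_B = B * rng.lognormvariate(0.0, obs_cv)   # noisy survey index
        catch = target_u * obs_B                      # fixed-rate control rule
        catch = min(catch, B)                         # cannot take more than exists
        B = max(B + r * B * (1.0 - B / K) - catch, 1e-6)  # logistic dynamics
        catches.append(catch)
        biomasses.append(B)
    avg_catch = sum(catches) / years
    p_depleted = sum(b < 0.2 * K for b in biomasses) / years
    return avg_catch, p_depleted

avg_catch, p_depleted = run_mse()
print(f"average catch = {avg_catch:.1f}, P(B < 0.2K) = {p_depleted:.2f}")
```

    A real MSE would run many stochastic replicates per candidate rule and compare rules on trade-off plots of yield versus depletion risk; ecosystem-based MSE replaces the single-species operating model with a multi-species or whole-ecosystem model.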

    The Origin and Evolution of Mutations in Acute Myeloid Leukemia

    Most mutations in cancer genomes are thought to be acquired after the initiating event, which may cause genomic instability and drive clonal evolution. However, for acute myeloid leukemia (AML), normal karyotypes are common, and genomic instability is unusual. To better understand clonal evolution in AML, we sequenced the genomes of M3-AML samples with a known initiating event (PML-RARA) versus the genomes of normal karyotype M1-AML samples and the exomes of hematopoietic stem/progenitor cells (HSPCs) from healthy people. Collectively, the data suggest that most of the mutations found in AML genomes are actually random events that occurred in HSPCs before they acquired the initiating mutation; the mutational history of that cell is “captured” as the clone expands. In many cases, only one or two additional, cooperating mutations are needed to generate the malignant founding clone. Cells from the founding clone can acquire additional cooperating mutations, yielding subclones that can contribute to disease progression and/or relapse.